# WikiText Pretraining

## Nystromformer 4096
Long-sequence Nyströmformer model trained on the WikiText-103 v1 dataset; supports sequence processing of up to 4,096 tokens.
Category: Large Language Model · Transformers
Publisher: uw-madison
## Nystromformer 2048
Nyströmformer model trained on the WikiText-103 dataset; supports long-sequence processing of up to 2,048 tokens.
Category: Large Language Model · Transformers
Publisher: uw-madison
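
Both checkpoints are published by uw-madison in the Transformers category, so they can typically be loaded through the Hugging Face `transformers` library for masked-language-model inference. The sketch below is a minimal example under that assumption; the Hub ID `uw-madison/nystromformer-4096` and the availability of a `[MASK]` token in its tokenizer are inferred from the listing, not confirmed by it, so verify the exact model ID on the Hub before use.

```python
# Minimal sketch: masked-language-model inference with a Nystromformer
# checkpoint via Hugging Face transformers. The Hub ID below is assumed
# from the publisher and model name in the listing above.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "uw-madison/nystromformer-4096"  # assumed Hub ID; swap in ...-2048 for the smaller context
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)
model.eval()

# A short example; the model itself accepts sequences up to 4,096 tokens.
text = "Paris is the [MASK] of France."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Find the [MASK] position and decode the highest-scoring token for it.
mask_positions = (inputs.input_ids == tokenizer.mask_token_id)[0].nonzero(as_tuple=True)[0]
predicted_ids = logits[0, mask_positions].argmax(dim=-1)
print(tokenizer.decode(predicted_ids))
```

The same snippet applies to Nystromformer 2048 by changing `model_id`; the two checkpoints differ only in the maximum sequence length they were trained to handle.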